
Script to fetch logs over the network #381

Merged: aschokking merged 3 commits into main from fetch-logs, Apr 8, 2026
Conversation

@aschokking
Contributor

Summary

Written by Claude.

Adds a PowerShell script (fetch-logs.ps1) for pulling .wpilog files off the robot's USB key after a match or
practice session.

  • rsync-like sync: uses stat to compare remote vs local file sizes and skips files already downloaded — safe to
    re-run without re-fetching everything
  • Newest files first: sorts by mtime so the most recent match logs come down first
  • Parallel transfers: up to 4 concurrent scp connections (configurable via -Throttle) to overlap SSH handshake
    overhead across files
  • Live progress: a Write-Progress bar tracks bytes written to disk by in-flight transfers, updating every 300ms
    within each file — no more silent waiting
  • Faster cipher: prefers chacha20-poly1305 over the default AES, which is significantly faster on the roboRIO's
    ARM CPU that lacks hardware AES acceleration
  • gitignored output: robot-logs/ destination folder added to .gitignore
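The "rsync-like sync" bullet above can be sketched as a small shell function. This is a hypothetical illustration, not the merged PowerShell code: the real script reads remote sizes with stat over ssh, while here the remote size is passed in directly so the skip decision itself is visible.

```shell
#!/bin/sh
# Sketch of the "rsync-like" skip decision: a file is re-fetched only when
# the local copy is missing or its size differs from the remote size.
# Hypothetical helper -- the real fetch-logs.ps1 is PowerShell and gets
# remote sizes via `stat` over ssh; here the remote size is an argument.
needs_fetch() {
    local_path="$1"
    remote_size="$2"
    # No local copy yet: fetch it.
    [ -f "$local_path" ] || return 0
    # Size mismatch (partial or changed file): fetch it again.
    local_size=$(stat -c %s "$local_path" 2>/dev/null || wc -c < "$local_path")
    [ "$local_size" -ne "$remote_size" ]
}
```

A caller would gate each scp on this check, e.g. `needs_fetch robot-logs/FRC_1.wpilog "$size" && scp ...`, which is what makes re-running the script cheap.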

Usage

Download new/changed logs to .\robot-logs\

.\fetch-logs.ps1

Specify a destination

.\fetch-logs.ps1 -Destination C:\logs\match1

Re-download everything

.\fetch-logs.ps1 -Force

Limit to 2 parallel transfers

.\fetch-logs.ps1 -Throttle 2

Requires the robot to be connected (Ethernet or WiFi) with a USB key inserted at /u/logs.

aschokking and others added 2 commits April 4, 2026 13:31
- Parallel scp jobs with configurable throttle (default 4)
- Unified launch+collect loop so [done] messages appear throughout
- Write-Progress bar tracking bytes written to disk by in-flight transfers
- chacha20/aes128-ctr cipher preference for faster transfers on roboRIO ARM CPU
- robot-logs/ output directory added to .gitignore

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
@aschokking aschokking requested a review from a team April 4, 2026 20:54
Contributor

@rokadias rokadias left a comment


Not something that would typically pass my code review for PowerShell or bash scripting, but I don't see anything here that would break, either.

Just change the remote directory you pull from, unless you're sure that path works.

Comment thread fetch-logs.ps1

$Robot = "roboRIO-488-frc.local"
$User = "admin"
$RemoteLogDir = "/u/logs" # roboRIO mounts USB at /u
Contributor


I don't know if I believe this (I've seen it mentioned in the WPILib docs, though), but from what I've seen, which makes more sense as a typical Linux mount location, and where I've pulled them from, it's:

/media/sda1/logs

Comment thread fetch-logs.ps1
[int]$Throttle = 4
)

$Robot = "roboRIO-488-frc.local"
Contributor

@rokadias rokadias Apr 6, 2026


⭐ ⭐
I don't know if mDNS works on the driver station, but there's no reason to use mDNS for the rio.

10.4.88.2: this has to be the IP, which also rules out any issues or slowness from the DNS lookup.

Contributor


It can be 172.22.11.2 if you've connected over USB.
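Taken together, the thread suggests three candidate addresses: 10.4.88.2 over the radio, 172.22.11.2 over USB, and the mDNS name as a last resort. A hypothetical fallback (not part of the merged script) could try them in order; the probe is injected as a function name so the selection logic can be exercised without a robot on the network.

```shell
#!/bin/sh
# Hypothetical host-selection sketch for the addresses discussed in the
# review thread. `probe` is the name of a shell function or command that
# succeeds when the host answers (e.g. a one-packet ping with a short
# timeout). Prints the first reachable host.
pick_robot_host() {
    probe="$1"; shift
    for host in "$@"; do
        if "$probe" "$host"; then
            echo "$host"
            return 0
        fi
    done
    return 1
}
```

With a real probe such as `reachable() { ping -c1 -W1 "$1" >/dev/null 2>&1; }`, a caller would run `pick_robot_host reachable 10.4.88.2 172.22.11.2 roboRIO-488-frc.local`, preferring static IPs so a dead mDNS lookup never stalls the transfer.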

@rokadias
Contributor

rokadias commented Apr 6, 2026

Also made a Linux-based one so that those who have Linux systems can copy logs easily:
#391

@Rongrrz
Contributor

Rongrrz commented Apr 6, 2026

Not particularly related to this, but should we also consider a script in the future to wipe previous logs from the USB keys, in case they ever get full and we forget to clean them?

@rokadias
Contributor

rokadias commented Apr 6, 2026

> Not particularly related to this, but should we also consider a script in the future to wipe previous logs from the USB keys, in case they ever get full and we forget to clean them?

It's a good question to ask. You could write a script that runs periodically to clean them up, though the log files are relatively small and would probably take more than a full season to fill the key.

In Linux you would write it as follows. You could then either wrap it in a script that performs a remote connection, similar to the script in this PR, or make a cron job that runs automatically at a particular interval.

find /media/sda1/logs -mtime +7 -delete

The find portion looks for all files under /media/sda1/logs with a modification time older than 7 days. find's expression is evaluated left to right, so you can't swap -delete and -mtime: with -delete first, every file would be deleted before the -mtime test ever filtered anything.

Some variants that make the delete step explicit are the following (note that the piped form splits on whitespace, so it can misbehave on filenames containing spaces):

find /media/sda1/logs -mtime +7 | xargs rm -rf
find /media/sda1/logs -mtime +7 -exec rm -f {} \;
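A null-delimited variant sidesteps the whitespace problem entirely. This sketch assumes GNU find, xargs, and touch, and runs against a throwaway directory rather than the roboRIO's /media/sda1/logs so the age cutoff can be checked locally.

```shell
#!/bin/sh
# Null-delimited cleanup: -print0 / xargs -0 handle filenames containing
# spaces or newlines; -r skips running rm when nothing matched.
# Demonstrated on a temp directory, not the robot's USB key.
logdir=$(mktemp -d)
touch -d '10 days ago' "$logdir/old match.wpilog"   # GNU touch: backdate mtime
touch "$logdir/fresh.wpilog"

# Delete only regular files modified more than 7 days ago.
find "$logdir" -type f -mtime +7 -print0 | xargs -0 -r rm -f
```

Dropping this one line into a cron entry (or running it over ssh like the fetch script) would cover the "wipe old logs" case raised above.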

@aschokking aschokking merged commit 2ad00cd into main Apr 8, 2026
5 checks passed
@aschokking aschokking deleted the fetch-logs branch April 8, 2026 02:40